Correlated Parameters to Accurately Measure Uncertainty in Deep Neural Networks

Authors

Abstract

In this article, a novel approach for training deep neural networks using Bayesian techniques is presented. The methodology allows an easy evaluation of model uncertainty and, additionally, is robust to overfitting. These are commonly the two main problems that classical, i.e., non-Bayesian, architectures struggle with. The proposed approach applies variational inference in order to approximate the intractable posterior distribution. In particular, the variational distribution is defined as the product of multiple multivariate normal distributions with tridiagonal covariance matrices. Each single normal distribution belongs either to the weights or to the biases of one network layer. The layerwise a posteriori variances are defined based on the corresponding expectation values, and furthermore, all correlations are assumed to be identical. Therefore, only a few additional parameters need to be optimized compared with non-Bayesian settings. The performance of the new approach is evaluated and compared with that of other recently developed Bayesian methods. The basis of the evaluations are the popular benchmark data sets MNIST and CIFAR-10. Among the considered approaches, the proposed one shows the best predictive accuracy. Moreover, extensive evaluations of the provided prediction uncertainty information indicate that the new approach often yields more useful uncertainty estimates than the comparison methods.
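To make the layerwise variational distribution concrete, the following is a minimal NumPy sketch of sampling one layer's parameters from a multivariate normal with a tridiagonal covariance matrix: every parameter has its own variance, adjacent parameters share a single common correlation `rho`, and all other correlations are zero. The function name, the parameter values, and the way `sigma` is supplied directly are illustrative assumptions, not the paper's exact parameterization (which ties the variances to the expectation values).

```python
import numpy as np

def sample_layer_params(mu, sigma, rho, n_samples=1):
    """Draw samples from N(mu, Sigma) for one layer's flattened parameters.

    Sigma is tridiagonal: Var(w_i) = sigma_i**2, Corr(w_i, w_{i+1}) = rho,
    and all non-adjacent correlations are zero, so the whole layer adds
    only the single extra parameter rho beyond the usual means/variances.
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    n = mu.size
    # Tridiagonal correlation matrix: ones on the diagonal, rho beside it.
    R = np.eye(n) + rho * (np.eye(n, k=1) + np.eye(n, k=-1))
    # Sigma = D R D with D = diag(sigma).
    Sigma = np.outer(sigma, sigma) * R
    # |rho| <= 0.5 keeps R positive definite for any layer size,
    # so the Cholesky factorization below is guaranteed to exist.
    L = np.linalg.cholesky(Sigma)
    eps = np.random.randn(n_samples, n)  # standard normal noise
    return mu + eps @ L.T
```

A reparameterized sample like this is what variational inference would backpropagate through; note that sampling costs stay close to the fully factorized (mean-field) case because the Cholesky factor of a tridiagonal matrix is bidiagonal.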


Similar articles

Uncertainty propagation through deep neural networks

In order to improve the ASR performance in noisy environments, distorted speech is typically pre-processed by a speech enhancement algorithm, which usually results in a speech estimate containing residual noise and distortion. We may also have some measures of uncertainty or variance of the estimate. Uncertainty decoding is a framework that utilizes this knowledge of uncertainty in the input fe...


Representing inferential uncertainty in deep neural networks through sampling

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modeling uncertainty is one of the key features of Bayesian methods. Scalable Bayesian DNNs that use dropout-based variational distributions have recently been proposed. Here we evaluate the ability of Bayesian DNNs trained with Bernoulli or Gaussian dis...


Robustly representing uncertainty through sampling in deep neural networks

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modelling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as an efficient and well performing variational inference method for DNNs. However, sampling from ot...


Correction: Measuring Fisher Information Accurately in Correlated Neural Populations

Neural responses are known to be variable. In order to understand how this neural variability constrains behavioral performance, we need to be able to measure the reliability with which a sensory stimulus is encoded in a given population. However, such measures are challenging for two reasons: First, they must take into account noise correlations which can have a large influence on reliability....


Measuring Fisher Information Accurately in Correlated Neural Populations

Neural responses are known to be variable. In order to understand how this neural variability constrains behavioral performance, we need to be able to measure the reliability with which a sensory stimulus is encoded in a given population. However, such measures are challenging for two reasons: First, they must take into account noise correlations which can have a large influence on reliability....



Journal

Journal title: IEEE Transactions on Neural Networks and Learning Systems

Year: 2021

ISSN: 2162-237X, 2162-2388

DOI: https://doi.org/10.1109/tnnls.2020.2980004